Learning Infinite Hidden Relational Models

Authors

  • Zhao Xu
  • Volker Tresp
  • Kai Yu
  • Hans-Peter Kriegel
Abstract

Relational learning analyzes the probabilistic constraints between the attributes of entities and relationships. We extend the expressiveness of relational models by introducing, for each entity (or object), an infinite-state latent variable as part of a Dirichlet process (DP) mixture model. The model can be viewed as a relational generalization of a hidden Markov random field. Information propagates through the interconnected network via the latent variables, reducing the need for extensive structure learning. For inference, we explore a Gibbs sampling method based on the Chinese restaurant process. The performance of our model is demonstrated in three applications: movie recommendation, gene function prediction, and a medical recommendation system.
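
The abstract points to inference via a Gibbs sampler based on the Chinese restaurant process (CRP). As a rough illustration of that idea (not the authors' implementation), the sketch below performs one collapsed Gibbs sweep over the cluster indicators of a DP mixture for entities with a single categorical attribute and a Dirichlet-multinomial likelihood. The function name crp_gibbs_sweep and the hyperparameters alpha (DP concentration) and beta (Dirichlet smoothing) are illustrative choices, and the relational likelihood terms that the full IHRM would add are omitted here.

    import numpy as np

    def crp_gibbs_sweep(x, z, alpha=1.0, beta=0.5, n_values=None, rng=None):
        """One collapsed Gibbs sweep over cluster indicators z for entities with a
        single categorical attribute x (Dirichlet-multinomial likelihood).

        Simplified sketch: the full IHRM likelihood would also include terms for
        the relational attributes linking entities, which are omitted here."""
        rng = np.random.default_rng() if rng is None else rng
        n = len(x)
        n_values = int(x.max()) + 1 if n_values is None else n_values

        for i in range(n):
            # Remove entity i from its current cluster.
            z[i] = -1

            # Relabel the remaining clusters compactly so empty clusters disappear.
            labels, relabeled = np.unique(z[z >= 0], return_inverse=True)
            z[z >= 0] = relabeled
            k_max = len(labels)

            # Cluster sizes and per-cluster attribute counts, excluding entity i.
            sizes = np.bincount(z[z >= 0], minlength=k_max)
            counts = np.zeros((k_max, n_values))
            for j in range(n):
                if z[j] >= 0:
                    counts[z[j], x[j]] += 1

            # CRP prior: an existing cluster is chosen with weight proportional to
            # its size, a new cluster with weight proportional to alpha.
            prior = np.append(sizes, alpha).astype(float)

            # Posterior-predictive probability of x[i] under each candidate cluster.
            like_existing = (counts[:, x[i]] + beta) / (sizes + n_values * beta)
            like_new = 1.0 / n_values
            weights = prior * np.append(like_existing, like_new)
            weights /= weights.sum()

            z[i] = rng.choice(len(weights), p=weights)
        return z

A toy run might initialize all entities in one cluster, e.g. z = np.zeros(len(x), dtype=int), and call the sweep repeatedly; the number of occupied clusters then adapts to the data rather than being fixed in advance, which is the nonparametric property the IHRM exploits.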


Related papers

Fast Inference in Infinite Hidden Relational Models

Relational learning (Dzeroski & Lavrac, 2001; Friedman et al., 1999; Raedt & Kersting, 2003) is an area of growing interest in machine learning. Xu et al. (2006) introduced the infinite hidden relational model (IHRM), which views relational learning in the context of the entity-relationship database model with entities, attributes, and relations (compare also (Kemp et al., 2006)). In the IHRM, for ea...


Social Network Mining with Nonparametric Relational Models

Statistical relational learning (SRL) provides effective techniques to analyze social network data with rich collections of objects and complex networks. Infinite hidden relational models (IHRMs) introduce nonparametric mixture models into relational learning and have been successful in many relational applications. In this paper we explore the modeling and analysis of complex social networks w...


Infinite Hidden Relational Models

Relational learning analyzes the probabilistic constraints between the attributes of entities and relationships. We extend the expressiveness of relational models by introducing for each entity (or object) an infinite-dimensional latent variable as part of a Dirichlet process (DP) mixture model. We discuss inference in the model, which is based on a DP Gibbs sampler, i.e., the Chinese restaurant...


A statistical relational model for trust learning

We address the learning of trust based on past observations and context information. We argue that from the truster’s point of view trust is best expressed as one of several relations that exist between the agent to be trusted (trustee) and the state of the environment. Besides attributes expressing trustworthiness, additional relations might describe commitments made by the trustee with regard...


Optimizing Probabilistic Models for Relational Sequence Learning

This paper tackles the problem of relational sequence learning by selecting relevant features elicited from a set of labelled sequences. Each relational sequence is first mapped into a feature vector using the result of a feature construction method. The second step finds an optimal subset of the constructed features that leads to high classification accuracy, by adopting a wrapper approach that...



Journal:

Volume   Issue

Pages  -

Publication date: 2006